Improved Fast Gauss Transform User Manual
Authors
Abstract
In most kernel-based machine learning algorithms and non-parametric statistics the key computational task is to compute a linear combination of local kernel functions centered on the training data, i.e., f(x) = ∑_{i=1}^{N} q_i k(x, x_i), which is the discrete Gauss transform for the Gaussian kernel. f is the regression/classification function in the case of regularized least squares, Gaussian process regression, support vector machines, kernel regression, and radial basis function neural networks. For non-parametric density estimation it is the kernel density estimate. Many kernel methods, such as kernel principal component analysis and spectral clustering, also involve computing the eigenvalues of the Gram matrix. Training Gaussian process machines involves the solution of a linear system of equations. Solutions to such problems can be obtained using iterative methods, in which the dominant computation is the evaluation of f(x).
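As a concrete illustration of the quantities in the abstract, the sketch below evaluates the discrete Gauss transform by direct summation and then uses it as the matrix-vector product inside a conjugate-gradient solve for a regularized least squares kernel machine. This is a minimal NumPy/SciPy sketch, not the manual's API: the names gauss_transform, h, and lam, and the choice of conjugate gradient, are illustrative assumptions, and the direct O(MN) summation is exactly the step that the improved fast Gauss transform approximates in roughly linear time.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def gauss_transform(targets, sources, weights, h):
    """Direct discrete Gauss transform: O(M*N) work for M targets and N sources."""
    # Squared Euclidean distances between every target y_j and every source x_i.
    d2 = ((targets[:, None, :] - sources[None, :, :]) ** 2).sum(axis=-1)
    # f(y_j) = sum_i weights_i * exp(-||y_j - x_i||^2 / h^2)
    return np.exp(-d2 / h**2) @ weights

rng = np.random.default_rng(0)
N, d = 500, 3
X = rng.standard_normal((N, d))                      # training points x_i
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)   # noisy training targets
h, lam = 1.0, 1e-2                                    # bandwidth and ridge term (assumed values)

# Regularized least squares: solve (K + lam*I) q = y iteratively, where
# K_ij = exp(-||x_i - x_j||^2 / h^2). Each matrix-vector product K v is
# one discrete Gauss transform with weights v, so the per-iteration cost
# is the cost of evaluating f at the training points.
A = LinearOperator((N, N),
                   matvec=lambda v: gauss_transform(X, X, v, h) + lam * v)
q, info = cg(A, y)        # info == 0 signals convergence

# The fitted function f(x) = sum_i q_i k(x, x_i) is evaluated at new points
# by one more Gauss transform.
X_new = rng.standard_normal((100, d))
f_new = gauss_transform(X_new, X, q, h)
```

In such a setup the direct summation inside gauss_transform would be swapped for the fast approximation, leaving the surrounding iterative solver unchanged.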
Similar resources
Efficient Kernel Machines Using the Improved Fast Gauss Transform
The computation and memory required for kernel machines with N training samples is at least O(N²). Such a complexity is significant even for moderate-size problems and is prohibitive for large datasets. We present an approximation technique based on the improved fast Gauss transform to reduce the computation to O(N). We also give an error bound for the approximation, and provide experimental res...
Insights on Fast Kernel Density Estimation Algorithms
We present results of experiments testing the Fast Gauss Transform, Improved Fast Gauss Transform, and Dual-Tree methods (using kd-tree and Anchors Hierarchy data structures) for fast Kernel Density Estimation (KDE). We examine the performance of these methods with respect to data set size, dimension, allowable error, and data set structure (“clumpiness”), measured in terms of CPU time and memo...
Empirical Testing of Fast Kernel Density Estimation Algorithms
We present results of experiments testing the Fast Gauss Transform, Improved Fast Gauss Transform, and Dual-Tree methods (using kd-tree and Anchors Hierarchy data structures) for fast Kernel Density Estimation (KDE). We examine the performance of these methods with respect to data set size, dimension, allowable error, and data set structure (“clumpiness”), measured in terms of CPU time and memo...
The fast Gauss transform with complex parameters
We construct a fast method, O(N log N), for the computation of discrete Gauss transforms with complex parameters, capable of dealing with unequally spaced grid points. The method is based on Fourier techniques, and in particular it makes use of a modified unequally spaced fast Fourier transform algorithm, in combination with previously suggested divide-and-conquer strategies for ordinary fast Ga...
Improved Fast Gauss Transform and Efficient Kernel Density Estimation
Evaluating sums of multivariate Gaussians is a common computational task in computer vision and pattern recognition, including the general and powerful kernel density estimation technique. The quadratic computational complexity of the summation is a significant barrier to the scalability of this algorithm to practical applications. The fast Gauss transform (FGT) has successfully accelerated ...